Bagging, Boosting, and Bloating in Genetic Programming

Author

  • Hitoshi Iba
Abstract

We present an extension of GP (Genetic Programming) by means of resampling techniques, i.e., Bagging and Boosting. Both methods manipulate the training data in order to improve the learning algorithm. In theory, they can significantly reduce the error of any weak learning algorithm by repeatedly running it. This paper extends GP by dividing the whole population into a set of subpopulations, each of which is evolved using the Bagging and Boosting methods. The effectiveness of our approach is shown by experiments. The performance is discussed in comparison with traditional GP with respect to the bloating effect.
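As a rough sketch of the resampling idea (the Bagging side only, and not the paper's actual algorithm), the Python fragment below resamples the training set with replacement, hands each bootstrap sample to a hypothetical `evolve_subpopulation` stand-in for one GP subpopulation run, and averages the subpopulations' outputs; a Boosting variant would instead reweight the training cases after each round.

```python
# A minimal sketch of the Bagging-style resampling idea described above,
# NOT the paper's actual implementation.  evolve_subpopulation() is a
# hypothetical stand-in for one GP subpopulation run.
import random
import statistics


def evolve_subpopulation(train_sample):
    """Placeholder for evolving one GP subpopulation on a resampled set.

    Here it merely returns a constant predictor fitted to the sample; a real
    GP run would evolve a program tree against this bootstrap sample.
    """
    mean_y = statistics.mean(y for _, y in train_sample)
    return lambda x: mean_y


def bagged_gp(training_data, n_subpops=10, seed=0):
    """Train one subpopulation per bootstrap sample, combine by averaging."""
    rng = random.Random(seed)
    models = []
    for _ in range(n_subpops):
        # Bootstrap resampling: draw |data| cases with replacement.
        sample = [rng.choice(training_data) for _ in training_data]
        models.append(evolve_subpopulation(sample))
    # Aggregate the subpopulations' outputs (averaging for regression;
    # Boosting would instead reweight training cases after each round).
    return lambda x: sum(m(x) for m in models) / len(models)


if __name__ == "__main__":
    data = [(x, 2.0 * x + 1.0) for x in range(20)]
    ensemble = bagged_gp(data)
    print(ensemble(5.0))
```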

Related Articles

Improving reservoir rock classification in heterogeneous carbonates using boosting and bagging strategies: A case study of early Triassic carbonates of coastal Fars, south Iran

Accurate reservoir characterization is a crucial task in the development of quantitative geological models and reservoir simulation. In this work, a novel view of reservoir characterization is presented, exploiting thin-section image analysis and intelligent classification algorithms. The proposed methodology comprises three main steps. First, four classes of...

Cellular Genetic Programming with Bagging and Boosting for the Data Mining Classification task

Genetic programming (GP) [14] is a general-purpose method that has been successfully applied to solve problems in different application domains. In the data mining field [8], GP has proved to be a particularly suitable technique for the task of data classification [12, 9, 10, 11] by evolving decision trees. Many data mining applications manage databases consisting of a very large numbe...

Genetic Programming of Heterogeneous Ensembles for Classification

The ensemble classification paradigm is an effective way to improve the performance and stability of individual predictors. Many ways to build ensembles have been proposed so far, most notably bagging- and boosting-based techniques. Evolutionary algorithms (EAs) have also been widely used to generate ensembles. In the context of heterogeneous ensembles, EAs have been successfully used to adjust w...

The Role of Combining Rules in Bagging and Boosting

Bagging and boosting can be used to improve weak classifiers. These techniques are based on combining classifiers. Usually, a simple majority vote or a weighted majority vote is used as the combining rule in bagging and boosting; however, other combining rules such as the mean, product, and average are possible. In this paper, we study bagging and boosting in Linear Discriminant Analysis (LDA) and t...
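For concreteness, here is a small, hypothetical Python illustration of such combining rules (majority vote versus the mean and product rules); it is not taken from the paper above, and the function names are invented for this sketch.

```python
# A small, hypothetical illustration (not from the paper) of three combining
# rules an ensemble can use to merge its members' outputs.
from collections import Counter
from math import prod


def majority_vote(labels):
    """Pick the class label predicted by the most members."""
    return Counter(labels).most_common(1)[0][0]


def mean_rule(prob_vectors):
    """Average the members' class-probability vectors, then take the argmax."""
    n_classes = len(prob_vectors[0])
    means = [sum(p[c] for p in prob_vectors) / len(prob_vectors)
             for c in range(n_classes)]
    return max(range(n_classes), key=means.__getitem__)


def product_rule(prob_vectors):
    """Multiply per-class probabilities across members, then take the argmax."""
    n_classes = len(prob_vectors[0])
    prods = [prod(p[c] for p in prob_vectors) for c in range(n_classes)]
    return max(range(n_classes), key=prods.__getitem__)


# Three members, two classes: all rules agree on class 1 here,
# but they can disagree when member outputs conflict.
probs = [[0.6, 0.4], [0.3, 0.7], [0.2, 0.8]]
print(majority_vote([0, 1, 1]))   # -> 1
print(mean_rule(probs))           # -> 1
print(product_rule(probs))        # -> 1
```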

A genetic approach for training diverse classifier ensembles

Classification is an active topic in machine learning. The most recent achievements in this domain suggest using ensembles of learners instead of a single classifier to improve classification accuracy. Comparisons between Bagging and Boosting show that classifier ensembles perform better when their members exhibit diversity, that is, commit different errors. This paper proposes a genetic algorit...


Journal:

Volume:   Issue:

Pages:  -

Year of publication: 1999